On the Generalization Ability of Recurrent Networks

Author

  • Barbara Hammer

Abstract

The generalization ability of discrete-time partially recurrent networks is examined. It is well known that the VC dimension of recurrent networks is infinite in most interesting cases, and hence the standard VC analysis cannot be applied directly. We find guarantees for specific situations where the transition function forms a contraction or the probability of long inputs is restricted. For the general case, we derive posterior bounds which take the input data into account. They are obtained via a generalization of the luckiness framework to the agnostic setting. The general formalism allows us to focus on representative parts of the data as well as more general situations such as long-term prediction.
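The contraction condition mentioned in the abstract can be illustrated numerically. The sketch below is a hypothetical example (the matrices `W` and `U` and all dimensions are assumptions, not from the paper): a recurrent transition h ↦ tanh(W·h + U·x) is a contraction in the Euclidean norm whenever the spectral norm of W is below 1, since tanh is 1-Lipschitz. Two trajectories started from different hidden states but driven by the same inputs then converge geometrically, which is the property the paper's guarantees for contractive transition functions rely on.

```python
import numpy as np

# Hypothetical recurrent transition h_{t+1} = tanh(W h_t + U x_t).
# It is a contraction in h when ||W||_2 < 1, because tanh is 1-Lipschitz.
rng = np.random.default_rng(0)
W = rng.standard_normal((4, 4))
W *= 0.5 / np.linalg.norm(W, 2)   # rescale so the spectral norm is 0.5 < 1
U = rng.standard_normal((4, 2))

def step(h, x):
    return np.tanh(W @ h + U @ x)

# Two different initial hidden states driven by the same input sequence.
h1, h2 = np.ones(4), -np.ones(4)
for _ in range(30):
    x = rng.standard_normal(2)
    h1, h2 = step(h1, x), step(h2, x)

# The gap between the trajectories shrinks by a factor of at most 0.5 per step,
# so after 30 steps it is essentially zero regardless of the inputs.
print(np.linalg.norm(h1 - h2))
```

Because the contraction factor is independent of the input sequence, long inputs cannot blow up the network's sensitivity to its initial state, which is what makes finite capacity bounds possible in this regime.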


Related articles

Monitoring of Regional Low-Flow Frequency Using Artificial Neural Networks

The ecosystems of arid and semiarid regions, which cover much of the country, are sensitive and fragile environments in which the factors of extinction and destruction take hold easily. In this paper, artificial neural networks (ANNs) are introduced to obtain improved regional low-flow estimates at ungauged sites. A multilayer perceptron (MLP) network is used to identify the funct...


Application of artificial neural networks on drought prediction in Yazd (Central Iran)

In recent decades, artificial neural networks (ANNs) have shown great ability in modeling and forecasting non-linear and non-stationary time series, and in most cases, especially in the prediction of phenomena, have shown very good performance. This paper presents the application of artificial neural networks to predict drought at the Yazd meteorological station. In this research, different archite...


Connection Reduction of the Recurrent Networks

There are many variations on the topology of recurrent networks. Models with fully-connected recurrent weights may not be superior to those models with sparsely connected recurrent weights in terms of capacity, time for training and generalization ability. In this paper, we show that the fully-connected recurrent networks can be reduced to functionally equivalent partially-connected recurrent n...


Design and Training of Artificial Neural Networks Using an Evolution Strategy with Parallel Populations

The application of artificial neural networks (ANNs) in areas such as the classification of images and audio signals shows the ability of this artificial intelligence technique to solve practical problems. Constructing and training ANNs is usually a time-consuming and hard process. A suitable neural model must be able to learn the training data and also have good generalization ability. In this pap...


Generalization In Simple Recurrent Networks

In this paper we examine Elman’s position (1999) on generalization in simple recurrent networks. Elman’s simulation is a response to Marcus et al.’s (1999) experiment with infants, specifically their ability to differentiate between novel sequences of syllables of the form ABA and ABB. Elman contends that SRNs can learn to generalize to novel stimuli, just as Marcus et al.’s infants did. However...




Journal title:

Volume   Issue 

Pages  -

Publication date: 2001